average entropy

See also in other dictionaries:

  • Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… …   Wikipedia
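
As a point of reference, the Shannon entropy of a discrete random variable X with probability mass function p(x) is conventionally written as the expected value of the self-information; this standard formula is added here for illustration and is not part of the dictionary excerpt.

```latex
% Shannon entropy of a discrete random variable X with pmf p(x).
% The logarithm base sets the unit: base 2 gives bits, base e gives nats.
H(X) = -\sum_{x} p(x)\,\log_2 p(x) = \mathbb{E}\!\left[-\log_2 p(X)\right]
```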

  • Entropy in thermodynamics and information theory — There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S , of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s; and the …   Wikipedia
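
The parallel mentioned in this entry is usually shown by placing the Gibbs and Shannon expressions side by side; the comparison below is a standard one, added for illustration rather than quoted from the entry.

```latex
% Gibbs entropy (thermodynamics) and Shannon entropy (information theory)
% over the same probabilities p_i; with H in bits they differ by k_B ln 2.
S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i, \qquad S = (k_B \ln 2)\, H
```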

  • Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information… …   Wikipedia

  • Entropy (statistical thermodynamics) — In thermodynamics, statistical entropy is the modeling of the energetic function entropy using probability theory. The statistical entropy perspective was introduced in 1870 with the work of the Austrian physicist Ludwig Boltzmann. Mathematical… …   Wikipedia
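
Boltzmann's statistical definition, to which this entry refers, relates entropy to the number of microstates compatible with a macrostate; the familiar form is given below for illustration.

```latex
% Boltzmann's entropy formula: k_B is Boltzmann's constant and
% W is the number of microstates realizing the given macrostate.
S = k_B \ln W
```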

  • Entropy rate — The entropy rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of …   Wikipedia
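
The limit described in this entry is conventionally written as follows; this is the standard definition, not text quoted from the entry.

```latex
% Entropy rate of a stochastic process (X_1, X_2, ...): the per-symbol
% limit of the joint entropy, when the limit exists.
H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n}\, H(X_1, X_2, \ldots, X_n)
```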

  • Introduction to entropy — Thermodynamic entropy provides a measure of certain aspects of energy in relation to absolute temperature. The thermodynamic entropy S, often simply called the entropy in the context of thermodynamics, is a measure of the amount of energy in a… …   Wikipedia

  • Topological entropy — In mathematics, the topological entropy of a topological dynamical system is a nonnegative real number that measures the complexity of the system. Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew. Their definition… …   Wikipedia
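
One commonly used equivalent formulation (due to Bowen and Dinaburg, rather than the open-cover definition of Adler, Konheim and McAndrew cited above) counts (n, ε)-separated orbit segments; it is shown here only as an illustration.

```latex
% Bowen-Dinaburg formulation of topological entropy for a continuous map f
% on a compact metric space; N(n, eps) is the maximal cardinality of an
% (n, eps)-separated set.
h_{\mathrm{top}}(f) = \lim_{\varepsilon \to 0}\, \limsup_{n \to \infty} \frac{1}{n} \log N(n, \varepsilon)
```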

  • Free entropy — A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. Also known as Massieu, Planck, or Massieu-Planck potentials (or functions), or (rarely) free information. In statistical mechanics, free entropies… …   Wikipedia
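
In one common convention, the Massieu potential mentioned in this entry is the Legendre transform of the entropy with respect to internal energy; sign and variable conventions vary between texts, so the form below is only a typical one.

```latex
% Massieu function (a free entropy) for entropy S, internal energy U,
% temperature T, and Helmholtz free energy F = U - TS.
\Phi = S - \frac{U}{T} = -\frac{F}{T}
```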

  • Principle of maximum entropy — This article is about the probability-theoretic principle. For the classifier in machine learning, see maximum entropy classifier. For other uses, see maximum entropy (disambiguation). …   Wikipedia

  • Maximum entropy probability distribution — In statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other members of a specified class of distributions. According to the principle of… …   Wikipedia
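
A standard example of this definition: among all real-valued distributions with a fixed variance, the normal distribution attains the greatest differential entropy. The well-known closed form is quoted below for illustration.

```latex
% Maximal differential entropy (in nats) over all densities with variance
% sigma^2 is attained by the normal distribution N(mu, sigma^2).
h\big(\mathcal{N}(\mu, \sigma^2)\big) = \tfrac{1}{2} \ln\!\big(2 \pi e \sigma^2\big)
```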

  • Differential entropy — (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. …   Wikipedia
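
The continuous analogue referred to in this entry replaces the sum over outcomes with an integral over the density f(x); the standard definition is shown below. Unlike Shannon entropy, it can be negative and depends on the parametrization of the variable.

```latex
% Differential entropy of a continuous random variable X with density f.
h(X) = -\int_{-\infty}^{\infty} f(x)\, \ln f(x)\, dx
```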
